A Fast Incremental Algorithm for Low Rank Approximations of Matrices and Its Applications in Facial Images
Authors
Abstract
This paper presents a fast incremental algorithm for low rank approximation, or dimensionality reduction, of matrices. Assuming that the matrices admit a double-sided decomposition, we can set up an incremental solution that consists of two coupled eigenmodels and hence a two-step updating procedure. At each step, we first represent the row-row or column-column covariance matrix in eigen-decomposition form and then orthogonally decompose the newly available matrix along the existing eigenspace, obtaining a more compact representation of the updated row-row or column-column covariance matrix. The eigenmodel can therefore be updated by solving an eigenvalue problem with a smaller number of eigenvalues. The algorithm is then applied to the tasks of image reconstruction on facial image databases and face tracking on videos; these examples provide extensive illustrations of the algorithm's performance.
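One updating step of this kind can be sketched in numpy, assuming the column-side eigenmodel tracks a covariance of the form C = Σ A Aᵀ. The function name and interfaces below are illustrative, not taken from the paper; the sketch only shows the general pattern the abstract describes: project the new matrix onto the current eigenspace, orthogonally decompose the residual, and solve a small eigenvalue problem in the compact combined basis.

```python
import numpy as np

def update_eigenmodel(U, S, A_new, rank):
    """Hypothetical one-step eigenmodel update (a sketch, not the paper's code).

    U, S  : current eigenvectors / eigenvalues of the column-column
            covariance (U is d x k with orthonormal columns, S has length k)
    A_new : newly available data matrix (d x n)
    rank  : number of eigenpairs to retain after the update
    """
    # Project the new matrix onto the existing eigenspace ...
    proj = U.T @ A_new
    # ... and orthogonally decompose the part lying outside it.
    resid = A_new - U @ proj
    Q, R = np.linalg.qr(resid)
    k, q = U.shape[1], Q.shape[1]
    # Covariance update C + A_new A_new^T expressed in the compact
    # orthonormal basis [U, Q]: a small (k+q) x (k+q) symmetric matrix.
    M = np.zeros((k + q, k + q))
    M[:k, :k] = np.diag(S) + proj @ proj.T
    M[:k, k:] = proj @ R.T
    M[k:, :k] = R @ proj.T
    M[k:, k:] = R @ R.T
    # Solve the smaller eigenvalue problem and rotate back to full space.
    w, V = np.linalg.eigh(M)            # ascending eigenvalues
    order = np.argsort(w)[::-1][:rank]  # keep the largest `rank` of them
    basis = np.hstack([U, Q])
    return basis @ V[:, order], w[order]
```

The point of the construction is that the eigenproblem solved per step has size k+q (old rank plus new columns) rather than the full data dimension d.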
Similar references
Distributed and Cooperative Compressive Sensing Recovery Algorithm for Wireless Sensor Networks with Bi-directional Incremental Topology
Recently, the problem of compressive sensing (CS) has attracted considerable attention in the area of signal processing, and much of the research in the field addresses this issue. One of the applications where CS could be used is wireless sensor networks (WSNs). A WSN consists of many low-power wireless sensors. This requires that any improved algorithm for this appli...
Inexact and incremental bilinear Lanczos components algorithms for high dimensionality reduction and image reconstruction
Recently, matrix-based methods have gained wide attention in the pattern recognition and machine learning communities. The generalized low rank approximations of matrices (GLRAM) and the bilinear Lanczos components (BLC) algorithm are two popular algorithms that treat data as native two-dimensional matrix patterns. However, these two algorithms often require heavy computation time and memory sp...
متن کاملLow Rank Approximation using Error Correcting Coding Matrices
Low-rank matrix approximation is an integral component of tools such as principal component analysis (PCA), and an important instrument in applications like web search, text mining, and computer vision, e.g., face recognition. Recently, randomized algorithms were proposed to efficiently construct low rank approximations of large matrices. In this paper, we show how matrices from e...
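The generic randomized scheme that abstract builds on can be sketched as follows. This is a minimal Halko-style range-finder with a Gaussian test matrix for illustration only; the cited paper's contribution is to build the test matrix from error correcting codes instead, which is not reproduced here, and the function name is hypothetical.

```python
import numpy as np

def randomized_low_rank(A, rank, oversample=5, rng=None):
    """Generic randomized low-rank approximation (illustrative sketch).

    A Gaussian test matrix is used here; the referenced paper replaces
    it with a matrix derived from error correcting codes.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    # Sketch the range of A with a random test matrix, then orthonormalize.
    Omega = rng.standard_normal((n, rank + oversample))
    Q, _ = np.linalg.qr(A @ Omega)
    # Project A into the captured subspace and truncate via a small SVD.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub[:, :rank], s[:rank], Vt[:rank]
```

The expensive step is the matrix product A @ Omega; everything afterwards operates on matrices whose small dimension is rank + oversample, which is what makes the scheme effective on large inputs.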
Fast Computation of Convolution Operations via Low-Rank Approximation
Methods for the approximation of 2D discrete convolution operations are derived for the case when a low-rank approximation of one of the input matrices is available. Algorithms based on explicit computation and on the Fast Fourier Transform are described. Applications to the computation of cross-correlation and autocorrelation are discussed. Both theory and numerical experiments show that the u...
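The core idea behind such methods can be illustrated with a separable-filter sketch: if the kernel has an (approximate) rank-r SVD, each rank-1 term turns the 2D convolution into a cheap pair of 1D convolutions. The sketch below is an assumption about the underlying principle, not the paper's explicit-computation or FFT-based algorithms, and the function name is invented for illustration.

```python
import numpy as np

def lowrank_conv2d(img, ker, rank):
    """Approximate full 2D convolution img * ker via a rank-`rank` SVD
    of the kernel: each rank-1 term is a separable pair of 1D convolutions
    (down the columns, then along the rows)."""
    U, s, Vt = np.linalg.svd(ker)
    out = np.zeros((img.shape[0] + ker.shape[0] - 1,
                    img.shape[1] + ker.shape[1] - 1))
    for i in range(rank):
        col = s[i] * U[:, i]   # column factor of the i-th rank-1 term
        row = Vt[i]            # row factor of the i-th rank-1 term
        # 1D convolution along each axis in turn.
        tmp = np.apply_along_axis(np.convolve, 0, img, col, mode="full")
        out += np.apply_along_axis(np.convolve, 1, tmp, row, mode="full")
    return out
```

With rank equal to the kernel's true rank the result is exact; truncating to a smaller rank trades accuracy for fewer 1D passes, which is the trade-off the abstract's experiments quantify.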
Online Streaming Feature Selection Using Geometric Series of the Adjacency Matrix of Features
Feature Selection (FS) is an important pre-processing step in machine learning and data mining. Traditional feature selection methods assume that the entire feature space is available from the beginning. However, online streaming features (OSF) are an integral part of many real-world applications. In OSF, the number of training examples is fixed while the number of features grows with t...